Medicine & Science in Sports & Exercise
Ovid Technologies (Wolters Kluwer Health)
Preprints posted in the last 30 days, ranked by how well they match Medicine & Science in Sports & Exercise's content profile, based on 10 papers previously published here. The average preprint has a 0.06% match score for this journal, so anything above that is already an above-average fit.
Johnson, L. R.; Bond, C. W.; Noonan, B. C.
Background: Quadriceps weakness may reduce sagittal plane shock absorption during landing, shifting load toward the frontal plane and increasing knee abduction moment (KAM), a biomechanical risk factor for anterior cruciate ligament (ACL) injuries. Purpose: The purpose of this study was to evaluate the association between isokinetic quadriceps strength and peak KAM during drop vertical jump landing in adolescent athletes. Study Design: Secondary analysis of previously collected data. Methods: Healthy adolescent athletes completed quadriceps strength testing using an isokinetic dynamometer and a biomechanical assessment during a drop vertical jump task. Quadriceps strength was quantified as peak concentric torque and the peak external KAM was calculated during the landing phase on the dominant limb. Both strength and KAM were normalized to body mass. Linear regression was used to examine the association between normalized quadriceps strength and peak external KAM on the dominant limb. Results: The association between quadriceps strength and peak normalized KAM on the dominant limb was not statistically significant (β = -0.053, 95% CI [-0.137, 0.030]; F(1,119) = 1.62, R² = 0.013, p = 0.206). Quadriceps strength explained only 1.3% of the variance in peak KAM, indicating a negligible association between these variables in this cohort. Discussion: Quadriceps strength was not associated with peak normalized KAM during landing, suggesting that frontal-plane knee loading during a drop vertical jump is not meaningfully explained by maximal concentric quadriceps strength alone. KAM appears to be driven more by multi-joint movement strategy and neuromuscular coordination than by the capacity of a single muscle group.
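The single-predictor analysis reported above (one β, an F statistic, R², and a p value) can be sketched with `scipy.stats.linregress`; all data below are simulated placeholders, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated stand-ins for the study's variables (illustrative only):
# normalized quadriceps strength (Nm/kg) and peak external KAM (Nm/kg).
n = 121  # consistent with the F(1,119) degrees of freedom reported above
strength = rng.normal(2.5, 0.5, n)
kam = rng.normal(0.45, 0.15, n)  # independent of strength by construction

# Simple linear regression: KAM ~ strength
res = stats.linregress(strength, kam)
print(f"beta = {res.slope:.3f}, R^2 = {res.rvalue**2:.3f}, p = {res.pvalue:.3f}")
```

With simulated independent variables the slope and R² will hover near zero, mirroring the null association the abstract describes.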
Boukhris, O.; Suppiah, H.; Driller, M. W.
This study compared the effects of a 25-min nap opportunity and a 10-min non-sleep deep rest (NSDR) condition on perceptual, cognitive, and physical performance in physically active young adults. Sixty participants (26 female, 34 male; 22 ± 4 years) were randomly assigned to one of three groups (nap, NSDR, control; n = 20 each). All groups completed identical assessments immediately, 20 min, and 40 min post-intervention. Mixed-effects models, adjusted for sex, prior-night sleep, and weekly physical activity, revealed a significant Group × Time interaction for sleepiness, fatigue, readiness to perform, and handgrip strength (p < 0.05). At 40 min post-intervention, the nap group reported lower fatigue than control and higher readiness to perform than both control and NSDR (p < 0.05). No significant effects were observed for the NSDR condition on perceptual, cognitive, or physical outcomes (p > 0.05). These findings indicate that a short nap can enhance perceived readiness and reduce fatigue after a brief latency period, whereas NSDR did not elicit significant effects under the present conditions.
Stevenson, S.; Driller, M.; Fullagar, H.; Pumpa, K.; Suppiah, H.
Background: Emerging research indicates that light exposure may influence sleep quality. Identifying key light-exposure behaviours associated with poor sleep quality in athletes may allow practitioners to efficiently screen for sleep difficulties and prioritise athletes for further assessment. Translating these findings into a practical screening tool could enhance the willingness of high-performance professionals to monitor sleep and light exposure in athletes. Hypothesis: Key predictor variables identified by feature-reduction techniques will lead to higher predictive accuracy in determining which light behaviours are associated with poor sleep quality in athletes. Study Design: Cross-sectional study. Level of Evidence: Level 3. Methods: 121 athletes from varying competitive levels completed questionnaires, including the Light Exposure Behaviour Assessment (LEBA) and Pittsburgh Sleep Quality Index (PSQI). Poor sleep quality was defined using the PSQI cut-off >5. Least absolute shrinkage and selection operator (LASSO) regression identified the light exposure variables from the LEBA questionnaire most strongly associated with good and poor sleep quality in athletes. Three models were compared: a full-variable model (23 items), a factor-specific model (Factor 3: screen/device use), and a feature-reduced model (LASSO-selected items). Results: Phone use before bed and checking a phone/watch during the night were identified as the variables most strongly associated with poor sleep quality and were used for reduced-feature-set modelling. On an independent test set, the feature-reduced model achieved an area under the curve (AUC) of 0.83, sensitivity of 0.70, and specificity of 0.92. Conclusions: Our findings indicate that phone-related behaviours before and in bed are associated with a higher likelihood of poor sleep quality. These behaviours, combined with the developed nomogram, provide a preliminary, low-burden screening tool to identify athletes who may be experiencing sleep difficulties.
The high specificity indicates that athletes flagged by the tool are likely to have genuine poor sleep quality, warranting further assessment to identify underlying causes and appropriate interventions. Clinical Relevance: Education and interventions focused on the light-exposure factors most strongly associated with sleep quality in a multifaceted athletic population could be prioritised to optimise sleep quality. The developed sleep quality nomogram may be useful as a decision-making tool to improve sleep monitoring practice among practitioners.
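A minimal sketch of the LASSO-style screening pipeline described above (L1-penalised logistic regression to select features, then AUC on a held-out set); the data and "items" below are synthetic stand-ins, not the actual LEBA questionnaire:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in for 23 LEBA-style items from 121 athletes.
n, p = 121, 23
X = rng.normal(size=(n, p))
# Outcome driven by two "phone use"-style items (features 0 and 1),
# mimicking poor sleep quality defined as PSQI > 5.
logit = 1.5 * X[:, 0] + 1.2 * X[:, 1] - 0.5
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# L1 penalty shrinks uninformative coefficients to exactly zero (LASSO).
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.3)
lasso.fit(X_tr, y_tr)
selected = np.flatnonzero(lasso.coef_[0])
if selected.size == 0:          # guard: fall back to all items if nothing survives
    selected = np.arange(p)

# Refit a reduced model on the selected items and score on the held-out set.
reduced = LogisticRegression().fit(X_tr[:, selected], y_tr)
auc = roc_auc_score(y_te, reduced.predict_proba(X_te[:, selected])[:, 1])
print(f"selected items: {selected}, test AUC = {auc:.2f}")
```

The regularisation strength `C` and the two-signal data-generating process are illustrative choices; in practice `C` would be tuned by cross-validation, as LASSO workflows typically do.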
Moser, J. D.; Bond, C. W.; Noonan, B. C.
Objectives: Compare Anterior Cruciate Ligament (ACL) Return to Sport after Injury (ACL-RSI) scores over time following ACL reconstruction (ACLR) between male and female patients aged 15 to 25 years with primary ACL injuries and ACL reinjuries. Design: Retrospective cohort design. Setting: Sports physical therapy clinics. Participants: 332 patients aged 15-25 years who underwent ACLR following either primary ACL injury or ACL reinjury (either contralateral or ipsilateral graft reinjury) and had at least one observation of the ACL-RSI. Main Outcome Measures: ACL-RSI score. Results: ACL-RSI scores significantly increased over time post-ACLR (p < .001), males reported significantly higher scores compared to females (p < .001), and patients with contralateral ACL reinjury demonstrated higher scores than those with ipsilateral ACL graft reinjury (p = .006), though there was no difference in scores between patients with primary ACL injury and ACL reinjury. A significant interaction effect of sex and injury status was also observed (p = .009), generally demonstrating that females had lower psychological readiness compared to males across injury statuses. Conclusions: ACL-RSI following ACLR varies based on biological sex and time post-ACLR, though ACL reinjury, independent of the reinjured leg, does not appear to affect scores compared to primary ACL injury.
Johnson, O. S.; Bond, C. W.; Noonan, B. C.
Background: Psychological readiness to return to sport and subjective knee function are critical outcomes following ACL reconstruction (ACLR), yet they do not always progress in parallel. An athlete may demonstrate high subjective knee function but low psychological readiness, suggesting a mental barrier to return, or conversely, report high readiness despite persistent functional limitations, raising concerns of overconfidence and reinjury risk. Understanding how these domains change together during recovery is essential for identifying mismatches that may require targeted intervention. Purpose: The purpose of this study is to examine the relationship between changes in psychological readiness (ACL-RSI) and subjective knee function (IKDC) from early to late recovery following ACLR. Study Design: Secondary analysis of prospectively collected data. Methods: Athletes (N = 48, Age at ACLR = 17.7 ± 1.8 y) aged 15-25 years who underwent ACLR with an ipsilateral autograft, had a pre-injury MARX score > 8, and completed the ACL-RSI and IKDC questionnaires at 3.5 ± 1 and 7 ± 1 months post-ACLR were included. Percent changes in ACL-RSI and IKDC scores between early and late recovery were calculated. Spearman's rank correlation was used to examine the association between changes in psychological readiness and subjective knee function. Significance was set to p < .05. Results: The mean percent change in ACL-RSI was 40.7 ± 57.1% and the mean percent change in IKDC was 24.8 ± 18.1% from 3.5 ± 1 months to 7 ± 1 months post-ACLR. The percent changes in ACL-RSI and IKDC scores from 3.5 ± 1 months to 7 ± 1 months post-ACLR were moderately correlated (ρ = 0.350, 95% CI [0.089, 0.584], p = 0.012). Discussion: The main finding of this study was that subjective knee function and psychological readiness to return to sport changed in parallel from 3.5 to 7 months following ACLR.
Clinicians can use this information regarding the concordant progression of psychological readiness to return to sport and subjective knee function to personalize ACL rehabilitation for future patients. Overall, clinicians can understand that if psychological readiness improves, subjective knee function will likely improve over the 3.5- to 7-month post-ACLR time frame, and vice versa. Therefore, addressing both components at multiple time points during recovery may maximize the likelihood of returning to sport following ACLR.
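The Spearman rank-correlation analysis used in this study is a one-liner with SciPy; the arrays below are simulated percent-change values chosen only to illustrate the workflow, not the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# Illustrative percent changes in ACL-RSI and IKDC for 48 athletes,
# built with a modest shared component to mimic a moderate correlation.
shared = rng.normal(size=48)
rsi_change = 40.7 + 57.1 * (0.4 * shared + rng.normal(size=48))
ikdc_change = 24.8 + 18.1 * (0.4 * shared + rng.normal(size=48))

# Spearman's rho: rank-based, so robust to the skewed percent-change scales.
rho, p = spearmanr(rsi_change, ikdc_change)
print(f"rho = {rho:.3f}, p = {p:.3f}")
```

Rank-based correlation is a sensible design choice here because percent-change scores are bounded below and often skewed, which violates the normality assumptions behind Pearson's r.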
Sakoda, S.; Kajiwara, K.; Shuto, R.; Kumagae, H.; Yokoi, O.; Kawano, K.
Context: Clinical assessments of landing mechanics often require complex scoring systems or laboratory-based motion analysis, which can limit feasibility in routine practice. A visually based landing-mechanics score centered on a standardized optimal joint-alignment configuration ("Zero Position") may offer a simple, clinically deployable alternative. Objective: To determine the intra- and inter-rater reliability of a landing mechanics score based on standardized optimal joint alignment at the moment of maximal center-of-mass (COM) descent. Design: Cross-sectional reliability study. Setting: University athletic training facility. Patients or Other Participants: Ninety healthy male collegiate athletes. Main Outcome Measures: Landing mechanics were evaluated using frontal- and sagittal-plane video recordings, with scoring performed on the frame corresponding to maximal COM descent. Five criteria reflecting the standardized joint configuration ("Zero Position") were assessed. Intra- and inter-rater reliability were calculated using Cohen's kappa coefficients and Kendall's W. Results: All five criteria demonstrated moderate to substantial intra-rater reliability and moderate to almost perfect inter-rater reliability. The total landing-mechanics score showed excellent agreement across all comparisons. The scoring system required minimal training and was feasible to implement using standard video recordings. Conclusions: The landing-mechanics score centered on the Zero Position demonstrated high reliability and strong clinical feasibility. This simple, visually grounded assessment may support routine clinical screening, injury-risk evaluation, and return-to-sport decision-making. Future research should examine its applicability to single-leg landings and sport-specific high-risk movements.
Babir, F. J.; Marcotte-Chenard, A.; Sandilands, R. E.; Falkenhain, K.; Mulkewich, N.; Islam, H.; McCarthy, S. F.; Richards, D. L.; Madden, K.; Singer, J.; Riddell, M. C.; Jung, M. E.; Gibala, M. J.; Little, J. P.
Aims/hypothesis: To investigate the feasibility and preliminary efficacy of a 12-week remotely-delivered exercise snacks (ES) intervention in adults with type 2 diabetes. Methods: Insufficiently active adults with type 2 diabetes (N=69; 46 females; mean age ± SD: 58 ± 11 years) were randomized to an ES or mobility/stretching comparator group (CON), which involved 4 × 1-min bouts of either vigorous or low intensity exercise, respectively, on ≥5 days/week. The primary outcome was feasibility based on adherence. Secondary outcomes included exercise enjoyment (1-7 scale), rating of perceived exertion (RPE; 0-10 scale), heart rate (HR), hemoglobin A1c (HbA1c), blood biomarkers of cardiometabolic health, 30-second sit-to-stand capacity, grip strength, estimated maximal oxygen uptake, and anthropometrics. Results: Weekly adherence (estimated marginal mean [95% confidence interval]: 18 bouts [16 to 21] for both groups; P=0.99) and total enjoyment (ES: 4.5 [4.1 to 4.8] vs CON: 4.3 [4.0 to 4.7]; P=0.64) were high and not different between groups. Despite higher RPE (5.7 [5.4 to 6.1]) and peak HR (73 [70 to 77] % of age-predicted HR maximum) in ES vs CON (2.0 [1.7 to 2.4] and 61 [58 to 64] % of age-predicted HR maximum, respectively) (all P<0.001), there were no between-group differences in the change in any secondary outcome (all P>0.05) except for greater sit-to-stand capacity in ES after training (between-group effect estimate [95% confidence interval]: 1.9 repetitions [0.3 to 3.4]; P=0.02). Conclusions/interpretation: Exercise snacks were feasible to perform in the real world and improved physical capacity to a greater extent than CON in adults living with type 2 diabetes.
Trial registration: ClinicalTrials.gov ID: NCT06407245. Research in Context. What is already known about this subject? Exercise snacks (≤1-min bouts of vigorous exercise spaced out across the day) are a time-efficient and practical approach to promote vigorous exercise and break up sedentary time. Real-world exercise snack interventions appear feasible for middle-aged and older adults. What is the key question? Are 12 weeks of exercise snacks performed in the real world feasible for insufficiently active adults living with non-insulin-treated type 2 diabetes? What are the new findings? Exercise snacks are feasible for those living with type 2 diabetes to perform unsupervised in the real world, based on high adherence, enjoyment, and participant retention rates. Exercise snacks improved 30-second sit-to-stand capacity and reduced waist circumference, suggesting enhancements in physical capacity and body composition. How might this impact on clinical practice in the foreseeable future? Exercise snacks could be utilized to help individuals living with type 2 diabetes build a routine or habit of incorporating small amounts of physical activity into their daily lives. The improved physical capacity observed in the current study could contribute to lower fall risk and greater lower body strength in those with type 2 diabetes as they age.
Butts, A. F.; Hickey, J. W.; Spitz, G.; Xie, B.; Giesler, L. P.; Evans, L. J.; O'Brien, T. J.; Shultz, S. R.; Wright, B. J.; McDonald, S. J.; O'Brien, W. T.
BACKGROUND: The recovery from sport-related concussion (SRC) is highly heterogeneous, with many individuals experiencing symptoms that persist beyond typical recovery timeframes. The early identification of individuals at risk of prolonged symptoms is therefore critical to inform timely interventions and set realistic recovery expectations. Although acute symptom burden is one predictor of future symptom burden, reliance on self-reported measures may limit objectivity and reduce clinical utility in settings where symptom evaluation may be unreliable. In this prospective cohort study, we evaluated the discriminatory accuracy of the CogState Brief Battery, alone and in combination with the Sport Concussion Assessment Tool (SCAT), to distinguish Australian football players with SRC from Australian footballers without SRC at 24-hours post-injury/match. Furthermore, we examined whether CogState performance and symptom severity at 24 hours were associated with symptom outcomes at one-week post-injury. Adult amateur Australian football players (n=181) were recruited following SRC (n=109 SRC, 86% male) or after a non-injured match (n=72, 90% male). Participants completed the CogState Brief Battery, SCAT and Rivermead Post Concussion Questionnaire (RPQ) at 24-hours and one-week post-injury or match. Area under the receiver operating characteristic curve (AUC) analyses quantified the ability of 24-hour CogState task performance and SCAT symptom severity to distinguish SRC from controls. Linear regression models examined associations between CogState performance and symptom severity (SCAT and RPQ), within and across the 24-hour and one-week time points. Additional models evaluated whether combining 24-hour symptom severity assessments with CogState performance improved prediction of one-week symptom burden and symptomatic status.
SCAT symptom severity demonstrated excellent discriminatory classification accuracy for SRC versus controls at 24-hours post-injury (AUC [95% CI]: 0.949 [0.916-0.981]). CogState task performance showed lower discriminatory accuracy but demonstrated fair classification and prognostic utility (e.g., Identification task AUC [95% CI]: 0.666 [0.582-0.750]). CogState performance at 24-hours was significantly associated with overall symptom severity at both 24-hours and one-week, as well as with symptom severity across individual symptom domains. In combined models, 24-hour symptom severity and CogState performance independently contributed to the prediction of symptomatic from asymptomatic individuals at one-week post-SRC (e.g., Identification task AUC [95% CI]: 0.721 [0.606-0.835] for classification based on <4 versus ≥4 symptoms). These findings indicate that CogState performance at 24-hours post-SRC may serve as an objective adjunct to subjective symptom-based reporting, supporting both diagnosis and early prognostication in the clinical evaluation of SRC.
Hopkins-Rosseel, D.; Harris, J.; Aver Bretanha Ribeiro, P.; Bacon, S. L.; Hansen, N.; Hartley, T.; Hebert, A.-A.; E. Kimber, D.; Mabey, B.-J.; Marques Vieira, A.; Prince Ware, S.; Warner, P.; Way, K.; Yeung, C.
Exercise training is a cornerstone of Cardiovascular Rehabilitation (CR) and, as of now, moderate-to-vigorous continuous exercise training (MICT) is the standard. New exercise modalities in the context of CR are constantly being explored to improve patient outcomes. These Canadian Association of Cardiovascular Prevention and Rehabilitation (CACPR) exercise training recommendations provide a synthesis of evidence-informed recommendations from existing documents, including recommendations around High-Intensity Interval Training (HIIT). CACPR created a pan-Canadian Exercise Working Group with various knowledge users (e.g., kinesiologists/exercise physiologists, physiotherapists, cardiologists, and patients) with expertise in CR-based exercise, who developed knowledge gap questions related to exercise training based on a literature review and synthesis of all available recommendations. An independent evidence-synthesis team performed a rapid review and meta-analyses to address the questions. The working group used these data to develop the relevant recommendations. The final guidelines include 12 recommendations for CR exercise, including nine from previous documents and three new recommendations based on HIIT. The previous recommendations address exercise assessments and prescriptions for CR for various patient profiles. The new recommendations suggest that HIIT can be used to improve exercise capacity in patients with coronary artery disease (CAD), heart failure (HF) or atrial fibrillation. They also state that HIIT is superior to MICT in patients with CAD, that patients with HF should be considered for either HIIT or MICT, and that any HIIT interval duration can be used as part of CR. Overall, these recommendations provide guidance for exercise in Canadian CR programs.
Okubo, Y.; Phu, S.; Chaplin, C.; Hicks, C.; Coleman, E.; Humburg, P.; Martinez, P. S.; Lord, S.
BACKGROUND: Fall injuries in older adults are devastating and often caused by impaired reactive balance to unexpected trips and slips, which conventional exercise programs do not target. This study examined whether a low-dose perturbation balance training (PBT) program among older adults can improve balance recovery following trips and slips and reduce falls and fall injuries. METHODS: 111 older adults (65+ years) were randomised into an intervention or control group. The intervention group undertook one weekly PBT session for three weeks on the Trip and Slip Walkway, followed by three-monthly PBT booster sessions over one year, for a total of six sessions. The control group received an educational booklet. Blinded staff assessed laboratory falls induced by a trip and slip with a safety harness at baseline and one year. Numbers of falls and fall injuries in daily life were collected weekly for one year. RESULTS: Compared to the control group, the intervention group experienced a 26% reduction in laboratory falls at 12 months (RR = 0.74; 95% CI: 0.54, 0.99; P = .040), but there was no difference in the number of falls or trip- and slip-encounters in daily life. However, fall-related injuries were reduced by 57% (rate ratio = 0.43; 95% CI: 0.19, 0.94; P = .024) over one year. A reduction in falls occurred within the first three months, with greater benefit among participants who completed at least three training sessions. CONCLUSIONS: A low-dose PBT program improved reactive balance over 12 months and reduced injurious falls by 57%, with benefits likely due to enhanced reactive balance rather than proactive gait strategies. Older adults may require at least three sessions to achieve meaningful fall reduction, with periodic booster sessions to sustain benefits. Incorporating PBT into exercise programs may enhance their efficacy in preventing falls and fall injuries in daily life.
Key Points: A low-dose perturbation-based training program (six sessions over 12 months) improved reactive balance at 12 months and reduced injurious falls by 57%. Benefits are likely due to task-specific improvements in reactive balance against trips and slips rather than proactive gait strategies or other risk factors. Incorporating PBT into exercise programs may improve their efficacy in preventing falls and fall injuries in daily life. Why does this paper matter? Falls are the leading cause of injury-related hospitalization and loss of independence in older adults. By targeting reactive balance, an ability neglected by conventional exercise programs, this program offers a novel, evidence-based approach to enhance fall prevention and reduce injuries.
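For readers unfamiliar with the rate ratios quoted above, here is a quick sketch of how a rate ratio and its Wald-type 95% CI are computed from event counts and person-time; the counts below are invented for illustration and are not the trial's data:

```python
import math
from scipy.stats import norm

def rate_ratio_ci(a, pt1, b, pt0, alpha=0.05):
    """Rate ratio of group 1 vs group 0 with a Wald CI on the log scale.

    a, b     -- event counts (e.g. injurious falls) in each group
    pt1, pt0 -- person-time at risk in each group
    """
    rr = (a / pt1) / (b / pt0)
    se = math.sqrt(1 / a + 1 / b)   # SE of log(RR) under Poisson counts
    z = norm.ppf(1 - alpha / 2)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Invented counts: 9 injuries over 55 person-years (PBT) vs 21 over 56 (control).
rr, lo, hi = rate_ratio_ci(9, 55, 21, 56)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

Because the CI is symmetric on the log scale, an interval whose upper bound sits just below 1.0 (as in the trial's 0.19-0.94) corresponds to a borderline-significant protective effect.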
Rogan, S.; Farrell, G.; Schlarb, S.; Schlarb, M.; Agarwal, S.; Clijsen, R.
Background: Thoracic spine mobilization (TSM) has been proposed to influence autonomic nervous system (ANS) activity, yet evidence remains inconsistent and the feasibility of standardised protocols is unclear. This study aimed to evaluate whether a randomized TSM protocol can be implemented successfully in healthy participants and to provide preliminary estimates of its effects on heart rate variability (HRV) and heart rate (HR). Methods: A randomized feasibility trial was conducted with healthy young adults receiving six manual therapy sessions consisting of rotational mobilizations above Th5 over 14 days. Feasibility outcomes included adherence, absence of unexpected adverse events (UAE), and practicality of autonomic data acquisition. Physiological outcomes comprised the HRV parameters high-frequency (HF) power and the low-frequency/high-frequency ratio (LF/HF), plus HR, analyzed using autoregressive (AR) and fast Fourier transform (FFT) methods. Results: Procedural safety and methodological integrity were confirmed (no UAE; complete datasets), but feasibility was only partially achieved due to adherence shortfalls, higher attrition, and device-related delays. Physiologically, large effect sizes were observed in the intervention group: at evening assessment, HF_AR showed ES = 0.80 (p = .008); at morning assessment, HF_FFT ES = 0.72 (p = .016), HF_AR ES = 0.78 (p = .010), and LF/HF_AR ES = 0.70 (p = .021). HR remained unchanged. These findings suggest repeated TSM may modulate HRV, primarily through HF-related changes associated with vagal activity, while LF/HF interpretation remains controversial. Conclusion: A randomized TSM protocol is safe and methodologically viable with logistical refinements. Preliminary evidence indicates potential vagal modulation, warranting larger trials with respiratory control, ECG-based HRV, multimodal ANS measures, and clinical populations to confirm efficacy and translational relevance.
Witteveen, D.; Humphreys, D. K.
Background: Concern about long-term health effects of repetitive head impacts in football has increased, but it remains unclear whether position-specific exposure patterns were associated with differential long-term all-cause mortality among elite players across the 20th century. Methods: We conducted two retrospective cohort studies of elite male professional football players. The World Cup cohort included all players on the team rosters from FIFA World Cup tournaments (1930-1990), and the UEFA European Cup cohort included all players who appeared in annual quarterfinal, semifinal, or final matches (1956-1991). Vital status was ascertained through archival linkage. Playing position was harmonized into six categories. Age was the time scale. Cox proportional hazards models were stratified by birth cohort and adjusted for origin region; interaction models were used to estimate region-specific marginal hazard ratios. Findings: The World Cup cohort included 4,223 players (2,330 deaths), and the European Cup cohort included 2,710 players (1,126 deaths). In the World Cup cohort, goalkeepers had lower mortality than midfielders (hazard ratio [HR] 0.73, 95% CI 0.63-0.84), whereas center-forwards had higher mortality (HR 1.27, 95% CI 1.08-1.50); mortality among center-backs was elevated but not statistically significant (HR 1.13, 95% CI 0.98-1.31). In the European Cup cohort, center-backs (HR 1.28, 95% CI 1.07-1.55) and other defenders (HR 1.20, 95% CI 1.02-1.42) had higher mortality than midfielders. Region-stratified marginal estimates indicated that elevated risks for central playing roles were greatest in Northwestern Europe and Central/Eastern Europe. Interpretation: Among footballers active during the 20th century, long-term all-cause mortality differed by playing position and varied by region, with higher risks concentrated in central attacking and defensive roles.
These patterns were most pronounced in regions where aerial contests historically predominated, suggesting that long-term health risks associated with professional football participation vary by role-specific exposure profiles.
Vanegas Mueller, E.; Harford, M.; He, L.; Banerjee, A.; Leeson, P.; Villarroel, M.
Sudden cardiac death risk is 2-3-fold higher in athletes than in non-athletes. We classify sports-related cardiac arrhythmias using a novel explainability framework comprising data analysis, model interpretability, post-hoc visualisation, and systematic assessment. Two neural networks--one with interpretable sinc convolution and one with standard convolution--were trained on general-population ECGs (PhysioNet, n=88,253, 30 arrhythmias, three continents) and tested on professional footballers (PF12RED, n=161) via domain adaptation for normal sinus rhythm (NSR), sinus bradycardia (SB), incomplete right bundle branch block (IRBBB), and T-wave inversion (TWI). Sinc convolution achieved superior NSR detection (AUROC 0.75 vs 0.70), whilst standard convolution excelled at SB (0.74 vs 0.73), IRBBB (0.66 vs 0.58), and TWI (0.59 vs 0.54). Gradient-weighted Class Activation Mapping revealed that sinc models focus on physiologically relevant ECG segments (the PR interval for NSR/SB and the T wave for TWI). We hypothesise that sinc convolution better captures periodic rhythms but struggles with complex morphological patterns, suggesting architectural choice should align with underlying cardiac pathophysiology. Graphical abstract. Abbreviations: AI, artificial intelligence; AUPRC, area under the precision-recall curve; AUROC, area under the receiver operating characteristic curve; Conv, convolution; ECG, electrocardiogram; Grad-CAM, gradient-weighted class activation mapping; IAVB, first-degree atrioventricular block; IRBBB, incomplete right bundle branch block; LAD, left axis deviation; LBBB, left bundle branch block; LVH, left ventricular hypertrophy; NSR, normal sinus rhythm; QT, QT interval; RAD, right axis deviation; RBBB, right bundle branch block; RVH, right ventricular hypertrophy; SA, sinus arrhythmia; SB, sinus bradycardia; TWI, T-wave inversion; xAI, explainable artificial intelligence.
Liew, B. X. W.; Hu, J.; Altai, Z.; Soliman, A.; Gao, L.; McDonnell, S.; Guo, W.; Maas, S.; Cortes, N.
Background: People with hip or knee joint arthroplasties are commonly advised to avoid high-impact physical activities, despite increasing demand to return to sport and vigorous exercise. Current implant testing standards do not reflect real-world loading during high-impact tasks, and few studies have quantified implant loads in high-functioning individuals who have returned to such activities. Methods: High-functioning adults with a total hip arthroplasty (THA, n = 11), total knee arthroplasty (TKA, n = 4), or unicompartmental knee arthroplasty (UKA, n = 3) performed a range of low- to high-impact activities, including walking, running, hopping, countermovement jumps, landings, and change-of-direction tasks. Three-dimensional trunk and lower-limb kinematics and ground reaction forces were collected. Musculoskeletal modelling was used to quantify three-dimensional hip and knee joint contact forces. Linear mixed-effects models were used to rank implant loads across activities and to compare peak resultant joint loads with healthy controls from a prior study. Results: For people with THA, relative to walking, a 45° change of direction generated the highest predicted hip contact force (HCF; 8.38 BW, 95% CI 7.70-9.06), followed by running and unilateral hopping (all >1.5× walking, p < 0.05). Unilateral hopping and running produced the highest predicted knee contact forces in TKA and UKA participants (8.0-9.1 BW), both significantly greater than walking (p < 0.05). Compared with healthy controls, THA participants exhibited a lower predicted HCF during walking (-1.58 BW, 95% CI -2.46 to -0.69), but no group differences were observed for running, hopping, or jumping. Conclusion: High-impact activities vary widely in model-estimated hip and knee contact forces. Several tasks were not substantially higher than walking.
These data provide a biomechanical basis for evidence-informed activity prescription, regulatory implant testing, and future computational simulation of implant performance under realistic loading conditions.
Pascoe, M. A.
Purpose: Human anatomy remains foundational to clinical practice, yet reduced instructional hours raise concerns about graduate competence and preparedness for patient care. Although trainees often report confidence, supervisors may perceive deficiencies, creating a gap between self-assessment and external evaluation. This study examined stakeholder perspectives on anatomical competence within physical therapy education to identify areas of discordance in perceived capability. Methods: A cross-sectional web-based survey collected responses from 165 stakeholders associated with an entry-level Doctor of Physical Therapy program featuring a 16-week dissection curriculum. Participants rated four domains of anatomical competence using a 5-point ordinal scale. Group differences were analyzed with the Kruskal-Wallis test, which is appropriate for ordinal data. Results: Median ratings of preparedness and capability were 4 of 5 (quite prepared). Significant discordance emerged in three domains: recent graduates rated their foundational knowledge and ability to explain complex concepts to lay audiences higher than faculty or clinical instructors did, whereas faculty expressed lower confidence in graduates' ability to explain patient symptoms using anatomical principles. No significant differences were observed in the ability to describe structures by location, suggesting shared perceptions of basic anatomical understanding despite variation in applied reasoning. Conclusions: Stakeholders generally viewed graduates as well prepared, yet disagreement persisted regarding clinical application of anatomical knowledge. Faculty skepticism about symptom explanation indicates that mastery of anatomy alone does not guarantee clinical reasoning.
Curricular strategies emphasizing vertical integration and explicit connections between anatomical science and patient-centered reasoning may help bridge perception gaps and enhance professional competence.
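The Kruskal-Wallis comparison used in this study is straightforward to reproduce. Below is an illustrative pure-Python sketch of the H statistic with tie correction (ties are common with 5-point ordinal ratings); this is not the authors' analysis code, and the example ratings in the test are invented:

```python
from collections import Counter

def kruskal_wallis(*groups):
    """Kruskal-Wallis H statistic with tie correction, for ordinal data."""
    pooled = sorted(v for g in groups for v in g)
    n = len(pooled)
    # Assign each distinct value the mean of the ranks it occupies (tied
    # observations share their average rank).
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    rank_sums = [sum(ranks[v] for v in g) for g in groups]
    h = 12 / (n * (n + 1)) * sum(
        r * r / len(g) for r, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)
    # Correction factor for ties: divide H by 1 - sum(t^3 - t) / (N^3 - N).
    tie_counts = Counter(pooled)
    correction = 1 - sum(t**3 - t for t in tie_counts.values()) / (n**3 - n)
    return h / correction

# With no ties the correction is 1; H ~ 3.86 for these two groups.
h = kruskal_wallis([1, 2, 3], [4, 5, 6])
```

The resulting H is compared against a chi-squared distribution with (number of groups - 1) degrees of freedom to obtain the p-value.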
Hernando Redondo, J.; Camps-Vilaro, A.; Elosua, R.; Fornara, E.; Bermudez-Lopez, M.; Toran-Monserrat, P.; Jimenez-Navarro, A.; Valdivielso, J. M.; Lopez-Lifante, V. M.; Salas-Fernandez, T.; Cambray, S.; Cruz, R.; Marrugat De La Iglesia, J.; Hernaez, A.
Background: Evidence on how leisure-time physical activity (LTPA) shapes lifetime body mass index (BMI) remains fragmented and prone to confounding. Methods: We pooled 14,993 adults (30-90 y; 52.7% women; cohorts: REGICOR-ACRISC, ILERVAS, ARTPER) with baseline estimated LTPA (moderate-to-vigorous LTPA [MVLTPA] in REGICOR-ACRISC), genotype, and repeated BMI values from electronic health records (1990-2024, 36,157 measures). LTPA was categorized into cohort-specific quartiles; MVLTPA into 0, <100, <200, and ≥200 MET-min/day. In one-sample Mendelian randomization analyses, we categorized participants into quartiles of a cardiorespiratory fitness polygenic risk score derived from a large GWAS in UK Biobank. Group-dependent BMI trajectories were modeled using spline mixed-effects models. Obesity onset (first BMI ≥30 kg/m²) was analyzed with IPW-weighted Kaplan-Meier curves and Cox models. Results: Higher LTPA was associated with slower BMI increases at ages 30-60 (Q1: +0.120 vs Q4: +0.075 kg/m²·year), slower declines at ages 70-90 (Q1: -0.143 vs Q4: -0.123 kg/m²·year), and lower obesity risk (Q4 vs Q1: HR 0.83, 95% CI 0.72-0.96). Similar trends were observed for MVLTPA. Higher genetically determined cardiorespiratory fitness showed parallel gradients (ages 30-60, Q1: +0.109 vs Q4: +0.101 kg/m²·year; ages 70-90, Q1: -0.130 vs Q4: -0.102 kg/m²·year) and lower obesity risk (Q4 vs Q1: HR 0.66, 95% CI 0.56-0.78). Associations were present in women and men separately, but were stronger in men. Conclusions: Higher LTPA and MVLTPA were associated with more favorable lifelong BMI trajectories and delayed obesity onset, with convergent support from Mendelian randomization analyses, suggesting a causal protective role of physical activity in both sexes, stronger in men.
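To gauge the magnitude of the reported slopes, extrapolating the ages-30-60 rates across that 30-year window gives the cumulative between-quartile BMI gap; a minimal sketch using the abstract's point estimates (treating the spline-derived slopes as linear over the window is our simplifying assumption):

```python
def cumulative_bmi_change(rate_kg_m2_per_year, years):
    """Linear extrapolation of an annual BMI slope over an age window."""
    return rate_kg_m2_per_year * years

# Least- vs most-active quartile, ages 30-60 (30 years)
q1_gain = cumulative_bmi_change(0.120, 30)  # 3.60 kg/m2 gained
q4_gain = cumulative_bmi_change(0.075, 30)  # 2.25 kg/m2 gained
gap = q1_gain - q4_gain                     # ~1.35 kg/m2 in favor of Q4
```

A gap of roughly 1.35 kg/m² is the difference, for an average-height adult, between staying below and crossing the 30 kg/m² obesity threshold, which is consistent with the lower obesity hazard reported for Q4.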
Xie, C.; Wang, Y.; Li, D.; Yu, B.; Peng, S.; Wu, L.; Yang, M.
Handheld ultrasound devices have revolutionized point-of-care diagnostics, but their effectiveness remains limited by operator dependency and the need for specialized training. This paper presents an intelligent guidance and diagnostic assistance system for handheld wireless ultrasound devices, enabling automated carotid artery and thyroid examinations through handheld operation. Drawing inspiration from the Actor-Critic framework, we implement a simulation-based reinforcement learning approach for real-time probe navigation toward standard anatomical views. The system integrates YOLOv8n-based detection networks for carotid plaque and thyroid nodule identification, achieving real-time inference at 30 frames per second. Furthermore, we propose a hybrid measurement approach combining UNet segmentation with the Snake algorithm for precise biometric quantification, including carotid intima-media thickness (IMT), lumen diameter, and lesion dimensions. Experimental validation on clinical datasets demonstrates that the proposed system achieves 91.2% accuracy in standard plane acquisition, 87.5% mean average precision (mAP) for plaque detection, and 89.3% mAP for nodule identification. Measurement results show excellent agreement with expert sonographers, with IMT measurements exhibiting a mean absolute difference of 0.08 mm. These findings demonstrate the feasibility of intelligent handheld ultrasound examination, significantly reducing operator dependency while maintaining diagnostic accuracy comparable to experienced clinicians.
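The agreement figure quoted above (0.08 mm mean absolute difference for IMT) is a simple paired-comparison metric; a minimal sketch, with invented system-vs-expert readings rather than the authors' evaluation data:

```python
def mean_absolute_difference(system_mm, expert_mm):
    """Mean absolute difference between paired measurements (e.g. IMT in mm)."""
    if len(system_mm) != len(expert_mm):
        raise ValueError("measurement lists must be paired")
    return sum(abs(a - b) for a, b in zip(system_mm, expert_mm)) / len(system_mm)

# Hypothetical automated vs expert IMT readings (mm)
auto_imt   = [0.62, 0.71, 0.58, 0.90]
expert_imt = [0.60, 0.78, 0.55, 0.86]
mad = mean_absolute_difference(auto_imt, expert_imt)  # 0.04 mm here
```

In practice such agreement is usually reported alongside Bland-Altman limits, since the mean absolute difference alone hides systematic bias.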
Aoki, K.; Kasai, F.; Komaba, K.; Saito, J.; Yoshikawa, A.; Tashiro, N.; Inoue, H.; Uchibori, K.; Fukazawa, M.
Background: In critically ill patients admitted to the intensive care unit (ICU), rapid skeletal muscle atrophy frequently develops in the acute phase. This ICU-acquired weakness can significantly impair long-term physical function. Although the biceps brachii cross-sectional area (CSA) is commonly used to assess muscle atrophy, its ultrasound imaging can be technically challenging, and the flexor carpi ulnaris may offer a more accessible alternative. Therefore, this study aimed to investigate whether CSA changes of the flexor carpi ulnaris correlate with those of the biceps brachii in critically ill patients admitted to the ICU, and whether the flexor carpi ulnaris CSA reflects systemic muscle atrophy in the acute phase of the ICU stay. Methods: Twenty critically ill patients admitted to the ICU underwent serial ultrasound assessment of the biceps brachii and flexor carpi ulnaris CSAs on days 0, 5, 7, and 14 after admission. Longitudinal changes in CSA were analyzed using the Friedman and Wilcoxon signed-rank tests. Correlations between the biceps brachii and flexor carpi ulnaris were examined using Spearman's rank correlation, and structural equation modeling was applied to explore causal relationships between clinical variables and CSA changes. Results: Significant CSA reductions were observed in both the flexor carpi ulnaris (-20.6%) and biceps brachii (-16.3%) by day 14, and the relative CSA changes of the two muscles showed a moderate positive correlation (ρ = 0.5489, p = 0.0122). Structural equation modeling revealed that the biceps brachii CSA change had a positive effect on that of the flexor carpi ulnaris (β = 0.249, p = 0.0011). Moreover, body mass index was positively associated with the baseline flexor carpi ulnaris CSA (β = 0.042, p = 0.0004). However, the baseline flexor carpi ulnaris CSA was not a significant predictor of subsequent CSA changes.
Conclusion: Ultrasound measurement of the flexor carpi ulnaris CSA offers a practical alternative to that of the biceps brachii for early detection of muscle wasting in ICU patients. Given its anatomical accessibility and high sensitivity to early atrophic changes, it may serve as a feasible screening tool for ICU-acquired weakness and inform timely interventions.
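Spearman's rank correlation, used above to relate the two muscles' CSA changes, is simply the Pearson correlation of the ranks; an illustrative pure-Python sketch (the data in the comment are invented, not the study's measurements):

```python
def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of average ranks."""
    def rank(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0.0] * len(vals)
        i = 0
        while i < len(vals):
            j = i
            # Tied values share the mean of the ranks they occupy.
            while j < len(vals) and vals[order[j]] == vals[order[i]]:
                j += 1
            avg = (i + 1 + j) / 2
            for k in range(i, j):
                r[order[k]] = avg
            i = j
        return r

    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Mostly-monotone toy data: rho = 0.8
rho = spearman_rho([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])
```

A ρ near 0.55, as reported, indicates a moderate monotone association: patients losing more biceps brachii CSA tended to lose more flexor carpi ulnaris CSA, but with substantial individual scatter.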
Purssell, H.; Bennett, L.; Mostafa, M.; Landi, S.; Mysko, C.; Hammersley, R.; Patel, M.; Scott, J.; Street, O.; Piper Hanley, K.; The ID LIVER Consortium; Hanley, N. A.; Morling, J.; Guha, I. N.; Athwal, V. S.
Background and aims: Population screening for liver disease in high-risk groups is recommended, but community diagnosis is challenging because the disease remains asymptomatic until very advanced stages. Moreover, regional variation in testing availability can result in people with clinically significant liver disease being missed. Machine learning (ML) has been proposed as a method to reduce diagnostic error and automate screening. We present a novel machine-learning-derived algorithm (ID LIVER-ML) designed to predict the risk of clinically significant liver disease in a high-risk community population and identify those needing further investigation or specialist referral. Methods: Using data from 2039 patients recruited to two UK cohorts, we created a parsimonious model from investigations available in primary care, with liver stiffness measurement as the reference standard. The performance of ID LIVER-ML was compared against the FIB-4 score in a second, unseen hold-out cohort (n=327). Results: ID LIVER-ML performed well at identifying patients at risk of clinically significant liver fibrosis (sensitivity 0.90, specificity 0.43, PPV 0.54, NPV 0.86, AUC 0.83) and outperformed conventional risk scoring systems (FIB-4: AUC 0.65; NAFLD Fibrosis Score: AUC 0.66; APRI: AUC 0.53; BARD: AUC 0.58). Conclusion: Machine-learning-derived algorithms can help screen high-risk populations in a community setting for liver fibrosis. ClinicalTrials.gov ID: NCT04666402. Impact and implications: The prevalence of steatotic liver disease is rising globally and is an increasingly significant challenge for healthcare systems. Existing risk stratification scores are not validated in real-world cohorts where patients have risk factors for multiple aetiologies of liver disease.
Our work shows that a machine learning model can predict the risk of clinically significant liver disease from routine primary care data better than existing non-invasive risk stratification tools in a real-world cohort, highlighting a potential role for machine learning in automating fibrosis risk assessment in primary care. Highlights:
- Machine-learning-derived algorithms can predict the risk of clinically significant liver disease in an at-risk community population with mixed aetiologies of liver disease.
- The performance of the ML algorithm (ID LIVER-ML) is not affected by metabolic, alcohol, or mixed aetiologies.
- ID LIVER-ML outperforms traditional risk stratification scoring systems such as FIB-4 and the NAFLD Fibrosis Score.
- Compared to the FIB-4 score, the use of machine learning can reduce the need for secondary care investigations by 59%.
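FIB-4, the main comparator above, is computed from four routine values; this sketch uses the standard published formula (the cutoffs in the comment are the commonly cited ones, not taken from this abstract, and the patient values are hypothetical):

```python
import math

def fib4(age_years, ast, alt, platelets):
    """FIB-4 index: (age [y] x AST [U/L]) / (platelets [10^9/L] x sqrt(ALT [U/L]))."""
    return (age_years * ast) / (platelets * math.sqrt(alt))

# Hypothetical patient: 61 y, AST 50 U/L, ALT 40 U/L, platelets 150 x 10^9/L
score = fib4(61, 50, 40, 150)  # ~3.22; commonly cited cutoffs are ~1.45 and ~3.25
```

Because FIB-4 reduces to a fixed ratio of four inputs, it cannot adapt to mixed aetiologies or cohort shift, which is the gap a learned model such as ID LIVER-ML aims to close.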
Morris, T. P.; Tinney, E. M.; Toral, S.; O'Brien, A.; Gobena, E.; Hackman, L.; Nwakamma, M. C.; Perko, M. L.; Orchard, E.; Odom, H.; Chen, C.; Hwang, J.; Stillman, A. M.; Kramer, A. F.; Espanya-Irla, G.
Background: Sedentary behavior is highly prevalent following traumatic brain injury (TBI) and compounds existing risks for cardiovascular, neurodegenerative, and affective disorders. The cognitive and behavioral sequelae of TBI, including impaired decision-making, blunted reward processing, and cognitive fatigue, create particular barriers to adopting and maintaining an active lifestyle. Despite this, effective behavior change interventions targeting physical activity in community-dwelling TBI survivors remain scarce. Here, we evaluated the feasibility, compliance, and preliminary efficacy of a 12-week remotely delivered walking intervention combining planning, behavioral reminders, and monetary micro-incentives. Methods: Fifty-six adults aged 40-80 years with a mild-to-moderate TBI diagnosed between 3 months and 15 years prior were randomized to either a planning, reminders, and micro-incentives intervention (n=23) or a health advice control condition (n=25). Participants wore a Fitbit Inspire 3 continuously throughout the study. Intervention participants completed weekly phone calls to plan five 30-minute walks for the following week, received daily text message or email reminders on planned walk days, and earned small monetary incentives upon walk completion. Control participants received weekly health education calls. Feasibility was assessed through recruitment, retention, and adverse event rates. Compliance was assessed via phone call completion rates and Fitbit wear time. Efficacy outcomes included weekly walk counts, walking duration, and step counts, modeled over 12 weeks using Poisson generalized linear mixed models and linear mixed-effects models. Results: Forty-eight participants completed the study (retention rate: 84.2%), with high phone call compliance in both groups (intervention: 98.4%; control: 98.1%).
Intervention participants completed significantly more walks than controls from week 1 onward (aIRR = 5.33, 95% CI: 2.27-12.5, p < 0.001), with the group difference growing over time (interaction aIRR = 1.09 per week, 95% CI: 1.01-1.17, p = 0.029). Estimated marginal means indicated that intervention participants completed 5.5 times more walks than controls at week 1, increasing to 15.5 times more by week 12. The intervention group also walked significantly longer at week 1 (b = 62.14 min, 95% CI: 1.05-123.23, p = 0.046), with the advantage growing over time; by week 12, intervention participants walked 5.3 times longer than controls. Similarly, the intervention group accumulated significantly more steps during walks at week 1 (b = 4,779 steps, 95% CI: 45.50-9,513.00, p = 0.048), accumulating 3.1 times more steps than controls by week 12. Conclusions: A remotely delivered, multicomponent walking intervention targeting planning, behavioral reminders, and micro-incentives was feasible, well tolerated, and produced meaningful increases in walking activity in community-dwelling adults with TBI. With high retention and compliance, and consistent effects on walk counts, duration, and steps across the intervention period, these findings support a larger, fully powered trial.
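In a log-link model like the Poisson GLMM above, a group-by-week interaction multiplies the group rate ratio each week, so the week-1 effect compounds over the trial. A sketch using the reported point estimates; anchoring the baseline aIRR at week 1 is our assumption, which is why the compounded week-12 value lands near, rather than exactly on, the reported estimated marginal mean of 15.5:

```python
def group_rate_ratio(week, week1_irr=5.33, weekly_interaction=1.09):
    """Intervention-vs-control rate ratio at a given week under a log-link
    model: the weekly interaction term compounds multiplicatively."""
    return week1_irr * weekly_interaction ** (week - 1)

r_week1 = group_rate_ratio(1)    # 5.33 at week 1
r_week12 = group_rate_ratio(12)  # ~13.8 from these rounded point estimates
```

This multiplicative reading is why even a modest-looking interaction (1.09 per week) produces a roughly threefold widening of the group gap over 12 weeks.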